
    The benefits of synchronous collaborative information visualization: evidence from an experimental evaluation

    A large body of studies reports empirical evidence of how information visualization supports the comprehension and analysis of data. The benefits of visualization for synchronous group knowledge work, however, have not been addressed extensively. Anecdotal evidence and use cases illustrate the benefits of synchronous collaborative information visualization, but very few empirical studies have rigorously examined the impact of visualization on group knowledge work. We therefore designed and conducted an experiment in which we analyzed the impact of visualization on knowledge sharing in situated work groups. Our experimental study evaluates the performance of 131 subjects (all experienced managers) working in groups of 5 (for a total of 26 groups) on a real-life knowledge sharing task. We compare (1) a control condition (no visualization provided) with two visualization supports: (2) an optimal and (3) a suboptimal visualization (based on a previous survey). The facilitator of each group was asked to populate the provided interactive visual template with insights from the group and to organize the contributions according to the group consensus. We evaluated the results through both objective and subjective measures. Our statistical analysis clearly shows that interactive visualization has a statistically significant, objective, and positive impact on the outcomes of knowledge sharing, but that the subjects seem not to be aware of this. In particular, groups supported by visualization achieved higher productivity, higher quality of outcome, and greater knowledge gains. No statistically significant difference was found, however, between the optimal and the suboptimal visualization (as classified by the pre-experiment survey). Subjects also did not seem to be aware of the benefits the visualizations provided, as no difference between the visualization and the control conditions was found for the self-reported measures of satisfaction and participation. An implication of our study for information visualization applications is to extend them with real-time group annotation functionalities that aid the group sense-making process of the represented data.
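The three-condition comparison described in this abstract is the kind of between-groups design typically analyzed with a one-way ANOVA. The sketch below illustrates that analysis in plain Python; the group scores are synthetic placeholders, not the study's data, and the specific outcome measure is an assumption for illustration only.

```python
# Hedged sketch of a one-way ANOVA across three experimental conditions
# (control, optimal visualization, suboptimal visualization).
# The productivity scores below are synthetic placeholders, NOT the
# study's actual measurements.

def one_way_anova(*groups):
    """Return the F statistic for k independent samples."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    # Between-group sum of squares (k - 1 degrees of freedom).
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (N - k degrees of freedom).
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

control    = [52, 48, 55, 50, 47, 53, 49, 51]   # no visualization
optimal    = [63, 66, 61, 68, 64, 62, 67, 65]   # optimal visualization
suboptimal = [60, 64, 59, 65, 62, 61, 63, 66]   # suboptimal visualization

f_stat = one_way_anova(control, optimal, suboptimal)
print(f"F = {f_stat:.2f}")
```

In this synthetic setup the two visualization groups sit well above the control group, producing a large F; the resulting p-value would then be compared against a significance threshold, mirroring the kind of objective-outcome test the abstract reports.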

    It is all me: the effect of viewpoint on visual-vestibular recalibration

    Participants performed a visual–vestibular motor recalibration task in virtual reality. The task consisted of keeping the extended arm and hand stable in space during a whole-body rotation induced by a robotic wheelchair. Performance was first quantified in a pre-test in which no visual feedback was available during the rotation. During the subsequent adaptation phase, optical flow resulting from body rotation was provided. This visual feedback was manipulated to create the illusion of a smaller rotational movement than actually occurred, thereby altering the visual–vestibular mapping. The effects of the adaptation phase on hand stabilization performance were measured during a post-test that was identical to the pre-test. Three different groups of subjects were exposed to different perspectives on the visual scene, i.e., first-person, top view, or mirror view. Sensorimotor adaptation occurred for all three viewpoint conditions, with performance in the post-test session showing a marked under-compensation relative to the pre-test performance. In other words, all viewpoints gave rise to a remapping between vestibular input and the motor output required to stabilize the arm. Furthermore, the first-person and mirror view adaptation induced a significant decrease in the variability of the stabilization performance. Such a variability reduction was not observed for the top view adaptation. These results suggest that even if all three viewpoints can evoke substantial adaptation aftereffects, the more naturalistic first-person view and the richer mirror view should be preferred when reducing motor variability constitutes an important issue.
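The pre-test/post-test design above yields two simple summary quantities: the shift of the mean stabilization error (the adaptation aftereffect) and the change in its variability. The sketch below computes both; the hand-angle errors are synthetic placeholders, not the study's measurements, and the sign convention (positive error = under-compensation) is an assumption.

```python
import statistics

# Hedged sketch: summarizing an adaptation aftereffect from hand-angle
# errors recorded before and after the adaptation phase.  The values are
# synthetic placeholders, NOT the study's data; positive error is taken
# to mean under-compensation of the body rotation.
pre_errors  = [1.0, -0.5, 0.8, 0.2, -1.1, 0.6, -0.3, 0.4]   # deg, pre-test
post_errors = [6.2,  5.1, 7.0, 5.8,  6.5, 5.4,  6.1, 5.9]   # deg, post-test

# Aftereffect: shift of the mean stabilization error from pre to post.
aftereffect = statistics.mean(post_errors) - statistics.mean(pre_errors)

# Variability change: ratio of post- to pre-test standard deviation.
# A ratio < 1 would correspond to the variability reduction reported
# for the first-person and mirror viewpoints.
variability_ratio = statistics.stdev(post_errors) / statistics.stdev(pre_errors)

print(f"aftereffect = {aftereffect:.2f} deg, "
      f"variability ratio = {variability_ratio:.2f}")
```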

    A Goal-based Framework for Contextual Requirements Modeling and Analysis

    Requirements Engineering (RE) research often ignores the context in which a system operates, or presumes it to be uniform. This assumption is no longer valid in emerging computing paradigms, such as ambient, pervasive, and ubiquitous computing, where it is essential to monitor and adapt to an inherently varying context. Besides influencing the software, context may influence stakeholders' goals and the choices they make to meet them. In this paper, we propose a goal-oriented RE modeling and reasoning framework for systems operating in varying contexts. We introduce contextual goal models to relate goals and contexts; context analysis to refine contexts and identify ways to verify them; reasoning techniques to derive requirements reflecting the context and users' priorities at runtime; and, finally, design-time reasoning techniques to derive requirements for a system that can be developed at minimum cost and is valid in all considered contexts. We illustrate and evaluate our approach through a case study about a museum-guide mobile information system.
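The core idea of a contextual goal model — goal alternatives annotated with the context conditions under which they apply, plus minimum-cost selection at design time — can be sketched with a small data structure. The museum-guide goal variants, context facts, and costs below are illustrative assumptions, not the paper's actual models or notation.

```python
# Hedged sketch of a contextual goal model: goal variants carry the
# context facts required for them to apply, and a design-time style
# reasoner picks the cheapest variant valid in the current context.
# All names and costs are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Goal:
    name: str
    # Context facts that must all hold for this variant to apply.
    required_context: frozenset = frozenset()
    cost: int = 0

def applicable(goals, context):
    """Return the goal variants whose context conditions hold."""
    return [g for g in goals if g.required_context <= context]

def cheapest(goals, context):
    """Pick the minimum-cost variant applicable in the given context."""
    candidates = applicable(goals, context)
    return min(candidates, key=lambda g: g.cost) if candidates else None

variants = [
    Goal("guide_via_audio_tour", frozenset({"visitor_has_headset"}), cost=2),
    Goal("guide_via_screen_map", frozenset({"visitor_has_smartphone"}), cost=1),
    Goal("guide_via_staff", frozenset(), cost=5),  # applies in any context
]

best = cheapest(variants, {"visitor_has_smartphone"})
print(best.name)  # the screen-map variant is cheapest in this context
```

A runtime variant of the same reasoning would simply re-evaluate `cheapest` whenever monitored context facts change, which is the monitor-and-adapt loop the abstract motivates.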

    Designing and Implementing an Assessment Plan for a Virtual Engineering Lab

    This article describes the process of creating, implementing, and assessing an innovative learning tool. The game-based laboratory simulation "Gaming for Applied Materials Engineering" (GAME), incorporated into the Engineering curriculum at a large public university, is intended to facilitate the same learning previously taught in a traditional hands-on laboratory. Through this technological tool, researchers hope to extend an integral learning opportunity to students currently unable to access physical labs, as well as to augment and reinforce the material taught to those currently enrolled in physical lab courses. Throughout the article, the research team discusses the assessment methodology, describes several challenges that were overcome, and offers recommendations for others interested in utilizing game-based technology in educational settings.

    Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion

    Previous studies have shown that in tasks requiring participants to report the direction of apparent motion, task-irrelevant mono-beeps can "capture" visual motion perception when the beeps occur temporally close to the visual stimuli. However, the contributions of the relative timing of multimodal events and of the event structure, modulating uni- and/or crossmodal perceptual grouping, remain unclear. To examine this question and extend the investigation to the tactile modality, the current experiments presented tactile two-tap apparent-motion streams, with an SOA of 400 ms between successive, left-/right-hand middle-finger taps, accompanied by task-irrelevant, non-spatial auditory stimuli. The streams were presented for 90 seconds, and participants' task was to continuously report the perceived (left- or rightward) direction of tactile motion. In Experiment 1, each tactile stimulus was paired with an auditory beep, though odd-numbered taps were paired with an asynchronous beep, with audiotactile SOAs ranging from -75 ms to 75 ms. The perceived direction of tactile motion varied systematically with audiotactile SOA, indicative of a temporal-capture effect. In Experiment 2, two audiotactile SOAs, one short (75 ms) and one long (325 ms), were compared. The long-SOA condition preserved the crossmodal event structure (so the temporal-capture dynamics should have been similar to those in Experiment 1), but both beeps now occurred temporally close to the taps on one side (even-numbered taps). The two SOAs were found to produce opposite modulations of apparent motion, indicative of an influence of crossmodal grouping. In Experiment 3, only odd-numbered, but not even-numbered, taps were paired with auditory beeps. This abolished the temporal-capture effect and, instead, a dominant percept of apparent motion from the audiotactile side to the tactile-only side was observed independently of the SOA variation. These findings suggest that asymmetric crossmodal grouping leads to an attentional modulation of apparent motion, which inhibits crossmodal temporal-capture effects.
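The timing structure of Experiment 1 — taps alternating between hands at a 400-ms SOA, with every odd-numbered tap paired with a beep shifted by the audiotactile SOA — can be sketched as a simple event schedule. The tap SOA and beep-pairing rule come from the abstract; the stream length and the choice of starting hand are arbitrary assumptions for illustration.

```python
# Hedged sketch of the Experiment 1 stimulus schedule: taps alternate
# between hands at a 400-ms SOA; odd-numbered taps (1st, 3rd, ...) get
# a beep shifted by the audiotactile SOA, even-numbered taps a
# synchronous beep.  Starting hand and stream length are assumptions.
TAP_SOA_MS = 400

def build_stream(n_taps, audiotactile_soa_ms):
    """Return (tap_time_ms, hand, beep_time_ms) for each tap."""
    events = []
    for i in range(n_taps):
        tap_time = i * TAP_SOA_MS
        hand = "left" if i % 2 == 0 else "right"  # assumed starting hand
        is_odd_numbered = (i % 2 == 0)  # 0-based index 0 is the 1st tap
        beep_time = tap_time + audiotactile_soa_ms if is_odd_numbered else tap_time
        events.append((tap_time, hand, beep_time))
    return events

# A negative SOA places the beep before the tap, as in the -75 ms condition.
stream = build_stream(n_taps=6, audiotactile_soa_ms=-75)
for tap_time, hand, beep_time in stream:
    print(f"{tap_time:5d} ms  {hand:5s}  beep at {beep_time} ms")
```

Sweeping `audiotactile_soa_ms` over the abstract's -75 ms to 75 ms range would reproduce the set of timing conditions under which the temporal-capture effect was measured.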

    Preliminary archaeoentomological analyses of permafrost-preserved cultural layers from the pre-contact Yup’ik Eskimo site of Nunalleq, Alaska : implications, potential and methodological considerations

    Acknowledgements Site excavation and sample collection were conducted by archaeologists from the University of Aberdeen, with the help of archaeologists and student excavators from the University of Aberdeen, University of Alaska Fairbanks, and Bryn Mawr College, Kuskokwim Campus, College of Rural Alaska, and residents of Quinhagak and Mekoryuk. This study is funded through an AHRC grant for the project ‘Understanding Cultural Resilience and Climate Change on the Bering Sea through Yup’ik Ecological Knowledge, Lifeways, Learning and Archaeology’ to Rick Knecht, Kate Britton and Charlotta Hillderal (University of Aberdeen; AH/K006029/1). Thanks are due to Qanirtuuq Inc. and Quinhagak, Alaska for sampling permissions, and to entomologists working at the CNC in Ottawa for allowing access to reference collections of beetles, lice and fleas. Yves Bousquet, Ales Smetana and Anthony E. Davies are specially acknowledged for their help with the identification of coleopteran specimens. Finally, we would also like to thank Scott Elias for useful comments on the original manuscript. Peer reviewed. Publisher PDF.